Haptic-GeoZui3D: Exploring the Use of Haptics in AUV Path Planning

Authors

  • Rick Komerska
  • Colin Ware
Abstract

We have developed a desktop virtual reality system that we call Haptic-GeoZui3D, which brings together 3D user interaction and visualization to provide a compelling environment for AUV path planning. A key component in our system is the PHANTOM haptic device (SensAble Technologies, Inc.), which affords a sense of touch and force feedback – haptics – to provide cues and constraints that guide the user's interaction. This paper describes our system and how we use haptics to significantly augment our ability to lay out a vehicle path. We show how our system works well for quickly defining simple waypoint-to-waypoint (e.g. transit) path segments, and illustrate how it could be used to specify more complex, highly segmented (e.g. lawnmower survey) paths.

INTRODUCTION

AUV path planning is typically done today using either commercially available 2D visualization packages designed for surface ship hydrographic survey planning, or custom in-house applications with limited interaction capability. Although many aspects of current AUV mission design are relatively limited in nature and can be handled with 2D planning tools, the need to specify more complex AUV missions requiring 3D interaction is growing. For example, mapping a large-scale plume event will likely require planning complementary routes for multiple AUVs working together within the 3D water column. Given this need, working in a 3D virtual reality (VR) environment would seem to provide a more intuitive and natural setting for the AUV mission planner.

Experience has shown, though, that interacting in 3D VR environments is difficult. One problem is that many 3D environments, notably the large CAVE-type immersive environments, lack the high-resolution stereo imagery that enables good depth perception. Another major problem is that many 3D environments do not provide haptic feedback to the user. As human beings, though, we rely heavily upon the various force cues and constraints imposed by our real-world environment to support our interactions. For example, we constantly employ 2D surfaces such as floors and countertops to help us position items around us. Haptic devices that allow fine force constraints to be applied in VR environments are now commercially available. The question then arises: how should force be used to support user interaction in a VR environment?

The answer depends upon the haptic input device and its role in the application. In designing a medical simulator that uses a pen-based device such as the PHANTOM to model a virtual scalpel, it is appropriate to use force feedback to mimic the physically based forces created by contact of the scalpel with human tissue. For applications in which the interaction is not so obvious, such as AUV path planning, the idea of haptically modeling task constraints offers a solution. It has long been recognized that in many user interface problems, adding task-related constraints can improve a user interface. Computer-aided design programs employ concepts such as snap-dragging, for example, to force objects to visually line up or to rotate about certain fixed axes (Bier 1990). Adding force feedback enables users to feel these constraints embodied in a virtual element. Thus, for example, if a particular widget should only be allowed to rotate about a certain axis, that constraint can be physically imposed to restrict the range of motion of the input device.
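To make the idea concrete, the sketch below shows one common way such a constraint can be rendered as force on a pen-based device like the PHANTOM: the stylus tip is pulled back toward the nearest point on an allowed line by a spring-damper force. The function name, gains, and units are illustrative assumptions, not code from Haptic-GeoZui3D.

```python
# Minimal sketch of a haptic task constraint (illustrative, not the paper's code):
# the stylus tip is pulled toward the nearest point on an allowed line
# (e.g., a fixed axis or path segment) by a spring-damper force.
import numpy as np

def constraint_force(tip_pos, tip_vel, line_point, line_dir,
                     stiffness=0.5, damping=0.005):
    """Return a 3-DOF force that holds the stylus tip on a line constraint.

    tip_pos, tip_vel   : stylus position (mm) and velocity (mm/s)
    line_point, line_dir : a point on the constraint line and its direction
    stiffness, damping : illustrative gains (N/mm, N*s/mm)
    """
    d = np.asarray(line_dir, dtype=float)
    d /= np.linalg.norm(d)
    offset = np.asarray(tip_pos, dtype=float) - np.asarray(line_point, dtype=float)
    nearest = np.asarray(line_point, dtype=float) + np.dot(offset, d) * d
    error = nearest - tip_pos                      # displacement off the line
    off_line_vel = tip_vel - np.dot(tip_vel, d) * d  # velocity component off the line
    # Spring pulls the tip back to the line; damping removes off-line motion.
    return stiffness * error - damping * off_line_vel

# Example: the tip has drifted 2 mm off a constraint aligned with the x-axis.
f = constraint_force(tip_pos=np.array([10.0, 2.0, 0.0]),
                     tip_vel=np.zeros(3),
                     line_point=np.zeros(3),
                     line_dir=np.array([1.0, 0.0, 0.0]))
print(f)  # force of about 1 N pulling the tip back toward the line (along -y)
```

The same pattern extends to plane or surface constraints by projecting onto the constraint geometry before computing the restoring force.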
In the field of teleoperations, the notion of task constraints has led to the idea of using force feedback embodied in "virtual fixtures" to constrain a user's motion when carrying out manual and supervisory control tasks (Sayers and Paul 1994; Stanisic, Jackson et al. 1996). There are of course many constraints inherent in real-world interaction; e.g. physical objects do not in general interpenetrate when they come into contact. Haptic VR systems have demonstrated their capability to provide a compelling interaction environment while enhancing user productivity across several application areas, including petroleum exploration, medical training and industrial design. In the petroleum exploration industry, for example, VR and haptics have been shown to improve the speed and accuracy of seismic data analysis (McLaughlin and Orenstein 1997). Commercial firms now market procedural simulators for endoscopy and laparoscopy for surgical training, where the sense of touch plays a critical role (Tendick, Downes et al. 2000). In the field of product design and development, haptic VR systems are being used to provide a more natural and intuitive way of defining concepts in a completely digital environment (Grahl 2003).

This paper describes a VR system our lab has built, called Haptic Geographic Zoomable User Interface 3D, or Haptic-GeoZui3D (Komerska and Ware 2003), which is based upon this idea of using task constraints to support user interaction. We have chosen to demonstrate these ideas in an AUV path planning application because we believe that 3D haptic interaction and visualization technologies, when appropriately applied, can greatly assist and enhance such a task.

SYSTEM DESCRIPTION

Our Haptic-GeoZui3D application is built upon a visualization system and a haptic interface device, integrated together in a physical workspace. The visualization component is a modified version of our lab's Geographic Zoomable User Interface 3D, or GeoZui3D, application (Ware, Plumlee et al. 2001). It is used within our lab as a platform for exploring basic research questions in 3D interaction and as a practical tool for analyzing bathymetric data. GeoZui3D uses OpenGL for graphics rendering and can display stereo imagery when paired with appropriate hardware. It runs under the Windows, Irix and Linux operating systems.

We use a SensAble Technologies PHANTOM 1.0 haptic input device in our workspace. The PHANTOM was chosen because its pen interface provides a simple and intuitive pointing device that is similar in function to a mouse in a 2D environment, yet provides for 3D selection and the application of fine force constraints. It allows 3 degree-of-freedom (dof) position and 3-dof orientation tracking of the pen, while providing the capability to apply a 3-dof point force output. The PHANTOM 1.0 provides a haptic workspace comprising a rectangular volume 17 cm (width) by 14.5 cm (height) by 8 cm (depth).

In Haptic-GeoZui3D, the visualization and haptic components are unified in a Haptic Fish Tank VR arrangement, as illustrated in Figure 1. Fish Tank VR refers to the creation of a small but high-quality virtual reality that combines a number of technologies, such as head tracking and stereo glasses, to their mutual advantage (Ware, Arthur et al. 1993). A horizontal mirror is used to superimpose virtual computer graphics imagery onto the PHANTOM workspace. The placement of the mirror also means that the PHANTOM and the user's hand are hidden from view.
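Optically, this superimposition amounts to reflecting the monitor image across the mirror plane into the haptic workspace. The short sketch below isolates that plane-reflection operation; the coordinate frame, plane placement, and function name are assumptions for illustration, not details of our implementation.

```python
# Sketch (assumed geometry, for illustration): points on the 45-degree monitor,
# or the tracked eye position, reflected across the horizontal mirror plane so
# that rendered imagery appears to occupy the PHANTOM workspace below the mirror.
import numpy as np

def reflect_across_plane(p, plane_point, plane_normal):
    """Reflect point p across the plane defined by plane_point and plane_normal."""
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    d = np.dot(np.asarray(p, dtype=float) - np.asarray(plane_point, dtype=float), n)
    return np.asarray(p, dtype=float) - 2.0 * d * n

# Example: a point 25 cm above the mirror plane (here taken as z = 0, normal up)
# maps to a virtual point 25 cm below it, inside the haptic workspace.
virtual_pt = reflect_across_plane([0.0, 30.0, 25.0],
                                  plane_point=[0.0, 0.0, 0.0],
                                  plane_normal=[0.0, 0.0, 1.0])
print(virtual_pt)  # [0, 30, -25]: the reflected location seen in the mirror
```

In a real renderer this reflection would typically be folded into the viewing transform rather than applied point by point.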
Although the physical stylus and hand are hidden, a proxy for the pen that the user holds is shown, and because the user's actual eye position is used to compute the computer graphics imagery, visual and haptic imagery are co-registered at all times. To accomplish this, we use a 17-inch monitor set at a 45° angle above the mirror. Stereoscopic display is provided using NuVision Technologies stereo glasses with a monitor refresh rate of 100 Hz. We also provide head tracking through a Polhemus FASTRAK system with a sensor mounted to the stereo glasses.

There is a high degree of synergy between the elements that comprise our Haptic Fish Tank setup. The workspace volume is similar in size to the localized workspace that we interact with in our everyday experience. GeoZui3D works particularly well in our Haptic Fish Tank because it uses what is known as center-of-workspace interaction (Ware, Plumlee et al. 2001). In this interaction style, objects are brought to and operated upon at a fixed point located conceptually at arm's length from the user. We align this point with the center of the physical PHANTOM workspace, also at arm's length, so that interaction in the virtual environment matches what our body's proprioceptive sensors tell us. Figure 2 shows the actual system in use.
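As a rough illustration of how center-of-workspace interaction ties a large geographic scene to this small physical volume, the sketch below scales world coordinates about the current focus point and checks them against the 17 x 14.5 x 8 cm PHANTOM workspace. The uniform-scale model and all names are assumptions for illustration, not GeoZui3D's actual transform.

```python
# Sketch (assumed, illustrative only): mapping geographic coordinates into the
# PHANTOM 1.0 haptic workspace, a 17 x 14.5 x 8 cm box centered on the
# center of workspace.
import numpy as np

WORKSPACE_HALF_EXTENTS_CM = np.array([17.0, 14.5, 8.0]) / 2.0

def world_to_haptic(p_world, focus_world, cm_per_world_unit):
    """Scale a world-space point about the current focus (center of workspace)
    into haptic/device coordinates (cm), and report whether it is reachable."""
    p_haptic = (np.asarray(p_world, dtype=float)
                - np.asarray(focus_world, dtype=float)) * cm_per_world_unit
    reachable = bool(np.all(np.abs(p_haptic) <= WORKSPACE_HALF_EXTENTS_CM))
    return p_haptic, reachable

# Example: a waypoint 50 m east of and 10 m below the focus, displayed at 1 cm : 10 m.
p, ok = world_to_haptic([50.0, 0.0, -10.0], [0.0, 0.0, 0.0], cm_per_world_unit=0.1)
print(p, ok)  # [5, 0, -1] True: the waypoint lies inside the reachable haptic volume
```

Zooming or re-centering the scene changes the focus point and scale factor, which is what keeps the region of interest within reach of the stylus.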



Publication date: 2003